Leveraging Kullback–Leibler Divergence Measures and Information-Rich Cues for Speech Summarization

Authors
Abstract


Similar articles

Leveraging Effective Query Modeling Techniques for Speech Recognition and Summarization

Statistical language modeling (LM) that purports to quantify the acceptability of a given piece of text has long been an interesting yet challenging research area. In particular, language modeling for information retrieval (IR) has enjoyed remarkable empirical success; one emerging stream of the LM approach for IR is to employ the pseudo-relevance feedback process to enhance the representation ...
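To make the language-modeling idea above concrete, here is a minimal sketch of a Laplace-smoothed unigram model that assigns a log-probability ("acceptability" score) to a piece of text. The toy corpus and smoothing value are illustrative assumptions, not details from the cited paper.

```python
import math
from collections import Counter

def unigram_lm(corpus_tokens, smoothing=1.0):
    """Build a Laplace-smoothed unigram language model from a token list."""
    counts = Counter(corpus_tokens)
    vocab = len(counts)
    total = sum(counts.values())

    def log_prob(tokens):
        # Sum of per-token log-probabilities under the smoothed model.
        return sum(
            math.log((counts[t] + smoothing) / (total + smoothing * vocab))
            for t in tokens
        )

    return log_prob

corpus = "the cat sat on the mat".split()
lm = unigram_lm(corpus)
# An in-domain phrase scores higher (less negative) than an unusual one.
print(lm(["the", "cat"]), lm(["mat", "mat"]))
```

In retrieval-style LM approaches, a score of this kind ranks documents by how well their models generate the query; pseudo-relevance feedback then re-estimates the query model from the top-ranked documents.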


Improved speech summarization with multiple-hypothesis representations and Kullback–Leibler divergence measures

Imperfect speech recognition often leads to degraded performance when leveraging existing text-based methods for speech summarization. To alleviate this problem, this paper investigates various ways to robustly represent the recognition hypotheses of spoken documents beyond the top scoring ones. Moreover, a new summarization method stemming from the Kullback-Leibler (KL) divergence measure and ...
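The KL-divergence criterion mentioned above compares a term distribution estimated from the spoken document with one estimated from a candidate summary. A minimal sketch, with toy distributions assumed for illustration (not taken from the paper):

```python
import math

def kl_divergence(p, q):
    """D(P || Q) in nats over a shared discrete support; q must be nonzero where p is."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy term distributions: a spoken document vs. two candidate summaries.
doc      = [0.5, 0.3, 0.2]
summary1 = [0.45, 0.35, 0.20]   # close to the document distribution
summary2 = [0.10, 0.10, 0.80]   # far from it

# A KL-based selection criterion prefers the summary with the smaller divergence.
print(kl_divergence(doc, summary1), kl_divergence(doc, summary2))
```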


Relative Divergence Measures and Information Inequalities

Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's [17] relative information and Jeffreys' [16] J-divergence; the information radius, or Jensen difference divergence measure, is due to Sibson [23]. Burbea and Rao [3, 4] have also found applications for it in the literature. Taneja [25] studied anoth...
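The two symmetric measures named in this snippet are easy to state in code. The following sketch (with toy distributions assumed for illustration) computes Jeffreys' J-divergence and Sibson's information radius, the latter better known today as the Jensen-Shannon divergence:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def j_divergence(p, q):
    """Jeffreys' symmetric J-divergence: D(P||Q) + D(Q||P)."""
    return kl(p, q) + kl(q, p)

def jensen_difference(p, q):
    """Sibson's information radius (Jensen-Shannon divergence):
    average KL divergence to the midpoint distribution M = (P + Q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.6, 0.4]
q = [0.3, 0.7]
print(j_divergence(p, q), jensen_difference(p, q))
```

Unlike KL divergence itself, both measures are symmetric in their arguments, and the information radius is additionally bounded above by ln 2.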


Information Divergence Measures and Surrogate Loss Functions

In this extended abstract, we provide an overview of our recent work on the connection between information divergence measures and convex surrogate loss functions used in statistical machine learning. Further details can be found in the technical report [7] and conference paper [6]. The class of f-divergences, introduced independently by Csiszar [4] and Ali and Silvey [1], arise in many areas ...
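The f-divergence family mentioned here has a single defining formula, with KL divergence and total variation as special cases of the convex generator f. A minimal sketch, with toy distributions assumed for illustration:

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence: sum_i q_i * f(p_i / q_i), for convex f with f(1) = 0."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

# Two classic members of the family:
kl_f = lambda t: t * math.log(t) if t > 0 else 0.0   # f(t) = t log t   -> KL divergence
tv_f = lambda t: 0.5 * abs(t - 1)                    # f(t) = |t - 1|/2 -> total variation

p = [0.5, 0.3, 0.2]
q = [0.25, 0.25, 0.5]
print(f_divergence(p, q, kl_f), f_divergence(p, q, tv_f))
```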


Some inequalities for information divergence and related measures of discrimination

Inequalities which connect information divergence with other measures of discrimination or distance between probability distributions are used in information theory and its applications to mathematical statistics, ergodic theory and other scientific fields. We suggest new inequalities of this type, often based on underlying identities. As a consequence we obtain certain improvements of the well...
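A standard example of the kind of inequality this snippet describes is Pinsker's inequality, which bounds total variation distance by KL divergence. The sketch below verifies it numerically on toy distributions (assumed for illustration):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: half the L1 distance between P and Q."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Pinsker's inequality (in nats): D(P||Q) >= 2 * TV(P,Q)^2.
p = [0.9, 0.1]
q = [0.5, 0.5]
d, tv = kl(p, q), total_variation(p, q)
print(d, 2 * tv * tv, d >= 2 * tv * tv)
```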



Journal

Journal title: IEEE Transactions on Audio, Speech, and Language Processing

Year: 2011

ISSN: 1558-7916,1558-7924

DOI: 10.1109/tasl.2010.2066268